Oregon is dropping an artificial intelligence tool used in child welfare system
Child welfare officials in Oregon will stop using an algorithm to help decide which families are investigated by social workers, opting instead for a new process that officials say will make better, more racially equitable decisions.
The move comes weeks after an Associated Press review of a separate algorithmic tool in Pennsylvania that originally inspired Oregon officials to develop their model and that was found to have flagged a disproportionate number of Black children for "mandatory" neglect investigations when it was first put in place.
Oregon's Department of Human Services announced to staff via email last month that after "extensive analysis" the agency's hotline workers would stop using the algorithm at the end of June to reduce disparities concerning which families are investigated for child abuse and neglect by child protective services.
"We are committed to continuous quality improvement and equity," Lacey Andresen, the agency's deputy director, said in the May 19 email.
Jake Sunderland, a department spokesman, said the existing algorithm would "no longer be necessary," since it can't be used with the state's new screening process. He declined to provide further details about why Oregon decided to replace the algorithm and would not elaborate on any related disparities that influenced the policy change.
Hotline workers' decisions about reports of child abuse and neglect mark a critical moment in the investigations process, when social workers first decide if families should face state intervention. The stakes are high – not attending to an allegation could end with a child's death, but scrutinizing a family's life could set them up for separation.
Algorithms raise concerns about racial disparities
From California to Colorado and Pennsylvania, as child welfare agencies use or consider implementing algorithms, an AP review identified concerns about transparency, reliability and racial disparities in the use of the technology, including their potential to harden bias in the child welfare system.
U.S. Sen. Ron Wyden, an Oregon Democrat, said he had long been concerned about the algorithms used by his state's child welfare system and reached out to the department again following the AP story to ask questions about racial bias – a prevailing concern with the growing use of artificial intelligence tools in child protective services.
"Making decisions about what should happen to children and families is far too important a task to give untested algorithms," Wyden said in a statement. "I'm glad the Oregon Department of Human Services is taking the concerns I raised about racial bias seriously and is pausing the use of its screening tool."
Sunderland said Oregon child welfare officials had long been considering changing their investigations process before making the announcement last month.
He added that the state decided recently that the algorithm would be completely replaced by its new program, called the Structured Decision Making model, which aligns with many other child welfare jurisdictions across the country.
Oregon's Safety at Screening Tool was inspired by the influential Allegheny Family Screening Tool, which is named for the county surrounding Pittsburgh, and is aimed at predicting the risk that children face of winding up in foster care or being investigated in the future. The Oregon tool was first implemented in 2018. Social workers view the numerical risk scores the algorithm generates – the higher the number, the greater the risk – as they decide whether a different social worker should go out to investigate the family.
But Oregon officials tweaked their original algorithm to draw only from internal child welfare data in calculating a family's risk, and deliberately tried to address racial bias in its design with a "fairness correction."
In response to Carnegie Mellon University researchers' findings that Allegheny County's algorithm initially flagged a disproportionate number of Black families for "mandatory" child neglect investigations, county officials called the research "hypothetical," and noted that social workers can always override the tool, which was never intended to be used on its own.
Sen. Wyden backs national oversight of technology used in the child welfare system
Wyden is a chief sponsor of a bill that seeks to establish transparency and national oversight of software, algorithms and other automated systems.
"With the livelihoods and safety of children and families at stake, technology used by the state must be equitable — and I will continue to watchdog," Wyden said.
The second tool that Oregon developed – an algorithm to help decide when foster care children can be reunified with their families – remains on hiatus as researchers rework the model. Sunderland said the pilot was paused months ago due to inadequate data but that there is "no expectation that it will be unpaused soon."
In recent years, while under scrutiny by a crisis oversight board ordered by the governor, the state agency – currently preparing to hire its eighth new child welfare director in six years – considered three additional algorithms, including predictive models that sought to assess a child's risk of death and severe injury, whether children should be placed in foster care, and if so, where. Sunderland said the child welfare department never built those tools, however.